Variable Selection for Support Vector Machines via Smoothing Spline ANOVA

Authors

  • Hao Helen Zhang
Abstract

It is well known that the support vector machine paradigm is equivalent to solving a regularization problem in a reproducing kernel Hilbert space. The squared-norm penalty in the standard support vector machine controls the smoothness of the classification function. We propose, within the framework of smoothing spline ANOVA models, a new type of regularization that performs simultaneous classification and variable selection in the SVM. The penalty functional is the sum of the functional component norms, which automatically applies a soft-thresholding operation to the functional components and hence yields sparse solutions. We suggest an efficient algorithm that solves the resulting optimization problem by iteratively solving quadratic and linear programming problems. Numerical studies on both simulated data and real datasets show that the modified support vector machine is highly competitive with other popular classification algorithms in terms of both classification accuracy and variable selection.
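As a rough sketch of the formulation summarized above (the notation is chosen for illustration and not quoted from the paper): the standard SVM minimizes the hinge loss plus a squared RKHS norm,

\[
\min_{f \in \mathcal{H}_K}\; \frac{1}{n}\sum_{i=1}^{n}\bigl[1 - y_i f(x_i)\bigr]_{+} \;+\; \lambda \,\|f\|_{\mathcal{H}_K}^{2},
\]

while the proposed variant decomposes f = b + \sum_{\alpha} f_{\alpha} over the SS-ANOVA components and penalizes the sum of the (non-squared) component norms,

\[
\min_{f = b + \sum_{\alpha} f_{\alpha}}\; \frac{1}{n}\sum_{i=1}^{n}\bigl[1 - y_i f(x_i)\bigr]_{+} \;+\; \lambda \sum_{\alpha}\,\|f_{\alpha}\|_{\mathcal{H}_{\alpha}}.
\]

Because each norm enters unsquared, the penalty acts like a soft-threshold on the components: unimportant f_{\alpha} are shrunk exactly to zero, which is what produces the sparse, variable-selecting solution.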


Related articles

Support Vector Machines, Reproducing Kernel Hilbert Spaces and the Randomized GACV

This chapter is an expanded version of a talk presented in the NIPS 97 Workshop on Support Vector Machines. It consists of three parts: (1) A brief review of some old but relevant results on constrained optimization in Reproducing Kernel Hilbert Spaces (RKHS), and a review of the relationship between zero-mean Gaussian processes and RKHS. Application of tensor sums and products of RKHS includin...

Full text

Smoothing Spline ANOVA Models II. Variable Selection and Model Building via Likelihood Basis Pursuit

We describe Likelihood Basis Pursuit, a nonparametric method for variable selection and model building based on merging ideas from the Lasso and Basis Pursuit literature with smoothing spline ANOVA models. An application to nonparametric variable selection for risk factor modeling in the Wisconsin Epidemiological Study of Diabetic Retinopathy is described. Although there are many approaches to vari...

Full text
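A minimal sketch of the Lasso-type objective alluded to in the Likelihood Basis Pursuit summary above (the basis expansion and symbols are assumed for illustration, not quoted from that paper): the function is expanded in a finite dictionary of basis functions and an l1 penalty is placed on the coefficients,

\[
f(x) = b + \sum_{j} c_j B_j(x), \qquad
\min_{b,\,c}\; -\frac{1}{n}\sum_{i=1}^{n} \log L\bigl(y_i, f(x_i)\bigr) \;+\; \lambda \sum_{j} |c_j|,
\]

so that, as with the Lasso, many coefficients c_j are driven exactly to zero and the corresponding basis terms drop out of the model.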

Variable Selection in Bayesian Smoothing Spline ANOVA Models: Application to Deterministic Computer Codes

With many predictors, choosing an appropriate subset of the covariates is a crucial, and difficult, step in nonparametric regression. We propose a Bayesian nonparametric regression model for curve-fitting and variable selection. We use the smoothing spline ANOVA framework to decompose the regression function into interpretable main effect and interaction functions. Stochastic search variable se...

Full text
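For reference, the smoothing spline ANOVA decomposition mentioned in the summary above expands the regression function into main effects and interactions (truncated in practice at some interaction order); in generic notation,

\[
f(x_1,\dots,x_d) \;=\; b \;+\; \sum_{j} f_j(x_j) \;+\; \sum_{j<k} f_{jk}(x_j, x_k) \;+\; \cdots,
\]

and variable selection then amounts to deciding which of these functional components are nonzero.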

Relationships between Gaussian processes, Support Vector machines and Smoothing Splines

Bayesian Gaussian processes and Support Vector machines are powerful kernel-based methods for attacking the pattern recognition problem. Probably because of the very different philosophies of the fields in which they were originally proposed, techniques for these two models have been developed somewhat in isolation from each other. This tutorial paper reviews relationships between Bayesian Gaussian proc...

Full text

Variable Selection for Nonparametric Quantile Regression via Smoothing Spline ANOVA

Quantile regression provides a more thorough view of the effect of covariates on a response. Nonparametric quantile regression has become a viable alternative that avoids restrictive parametric assumptions. The problem of variable selection for quantile regression is challenging, since important variables can influence various quantiles in different ways. We tackle the problem via regularization in...

Full text
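A sketch of the regularized quantile regression objective underlying the summary above (check-loss notation assumed for illustration): with the check loss \rho_\tau(u) = u\,(\tau - I(u < 0)), the \tau-th quantile function is estimated by solving something of the form

\[
\min_{f}\; \frac{1}{n}\sum_{i=1}^{n} \rho_\tau\bigl(y_i - f(x_i)\bigr) \;+\; \lambda\, J(f),
\]

where J(f) is a roughness or component-norm penalty; in the SS-ANOVA setting one natural choice is J(f) = \sum_{\alpha} \|f_{\alpha}\|, mirroring the selection penalty used for the SVM above.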


Publication date: 2006